10 research outputs found

    Remixing physical objects through tangible tools

    Get PDF
    Thesis (S.M.)--Massachusetts Institute of Technology, School of Architecture and Planning, Program in Media Arts and Sciences, 2011. This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections. Cataloged from the student-submitted PDF version of the thesis. Includes bibliographical references (p. 147-164).
    In this document we present new tools for remixing physical objects. These tools allow users to copy, edit and manipulate the properties of one or more objects to create a new physical object. We already have these capabilities in digital media: we can easily mash up videos, music and text. However, it remains difficult to remix physical objects, and we cannot access the advantages of digital media, which are nondestructive, scalable and scriptable. We can bridge this gap by integrating 2D and 3D scanning technology into design tools and employing affordable rapid prototyping technology to materialize the remixed objects. In doing so, we hope to promote copying as a tool for creation. This document presents two tools, CopyCAD and KidCAD, the first designed for makers and crafters, the second for children. CopyCAD is an augmented Computer Numerically Controlled (CNC) milling machine that allows users to copy arbitrary real-world object geometry into 2D CAD designs at scale through a camera-projector system. CopyCAD gathers properties from physical objects, sketches and touch interactions directly on the milling machine, allowing novice users to copy parts of real-world objects, modify them and create a new physical part. KidCAD is a sculpting interface built on top of a gel-based realtime 2.5D scanner. It allows children to stamp objects into a block of gel, which are scanned in realtime, as if they were stamped into clay. Children can use everyday objects, their hands and tangible tools to design new toys or objects that will be 3D printed.
    This work enables novice users to easily approach designing physical objects by copying from other objects and sketching new designs. With increased access to such tools, we hope that a wide range of people will be empowered to create their own objects, toys, tools and parts.
    by Sean Follmer. S.M.

    LineFORM: Actuated Curve Interfaces for Display, Interaction, and Constraint

    Get PDF
    In this paper we explore the design space of actuated curve interfaces, a novel class of shape-changing interfaces. Physical curves have several interesting characteristics from the perspective of interaction design: they have a variety of inherent affordances; they can easily represent abstract data; and they can act as constraints, boundaries, or borderlines. By utilizing such aspects of lines and curves, together with the added capability of shape change, new possibilities emerge for display, interaction and body constraint. In order to investigate these possibilities we have implemented actuated curve interfaces at different scales. LineFORM, our implementation, inspired by serpentine robotics, is composed of a serial chain of 1-DOF servo motors with integrated sensors for direct manipulation. To motivate this work, we present various applications such as shape-changing cords, mobiles, body constraints, and data manipulation tools.

    deForm: An interactive malleable surface for capturing 2.5D arbitrary objects, tools and touch

    Get PDF
    We introduce a novel input device, deForm, that supports 2.5D touch gestures, tangible tools, and arbitrary objects through real-time structured light scanning of a malleable surface of interaction. DeForm captures high-resolution surface deformations and 2D grey-scale textures of a gel surface through a three-phase structured light 3D scanner. This technique can be combined with IR projection to allow for invisible capture, providing the opportunity for co-located visual feedback on the deformable surface. We describe methods for tracking fingers, whole-hand gestures, and arbitrary tangible tools. We outline a method for physically encoding fiducial marker information in the height map of tangible tools. In addition, we describe a novel method for distinguishing between human touch and tangible tools, through capacitive sensing on top of the input surface. Finally, we motivate our device through a number of sample applications.
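    The core of a three-phase structured light scanner can be sketched as follows. This is a generic three-step phase-shifting reconstruction, not deForm's actual pipeline; the calibration scale relating phase to deformation is a placeholder assumption.

    ```python
    import numpy as np

    def wrapped_phase(i1, i2, i3):
        """Per-pixel wrapped phase from three sinusoidal fringe images
        shifted by 120 degrees (standard three-step phase shifting)."""
        return np.arctan2(np.sqrt(3.0) * (i1 - i3), 2.0 * i2 - i1 - i3)

    def deformation_from_phase(phase, ref_phase, scale=1.0):
        """Approximate surface deformation as the re-wrapped phase difference
        from a flat reference scan, times an assumed calibration constant."""
        delta = phase - ref_phase
        delta = np.arctan2(np.sin(delta), np.cos(delta))  # wrap into (-pi, pi]
        return scale * delta
    ```

    Because the phase is recovered per pixel, the depth resolution is limited only by the camera, which is what allows a gel surface to be captured at high resolution in real time.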

    Physical Telepresence: Shape Capture and Display for Embodied, Computer-mediated Remote Collaboration

    Get PDF
    We propose a new approach to Physical Telepresence, based on shared workspaces with the ability to capture and remotely render the shapes of people and objects. In this paper, we describe the concept of shape transmission, and propose interaction techniques to manipulate remote physical objects and physical renderings of shared digital content. We investigate how the representation of users' body parts can be altered to amplify their capabilities for teleoperation. We also describe the details of building and testing prototype Physical Telepresence workspaces based on shape displays. A preliminary evaluation shows how users are able to manipulate remote objects, and we report on our observations of several different manipulation techniques that highlight the expressive nature of our system.
    National Science Foundation (U.S.). Graduate Research Fellowship Program (Grant No. 1122374)

    inFORM: Dynamic Physical Affordances and Constraints through Shape and Object Actuation

    Get PDF
    Past research on shape displays has primarily focused on rendering content and user interface elements through shape output, with less emphasis on dynamically changing UIs. We propose utilizing shape displays in three different ways to mediate interaction: to facilitate by providing dynamic physical affordances through shape change, to restrict by guiding users with dynamic physical constraints, and to manipulate by actuating physical objects. We outline potential interaction techniques and introduce Dynamic Physical Affordances and Constraints with our inFORM system, built on top of a state-of-the-art shape display, which provides for variable stiffness rendering and real-time user input through direct touch and tangible interaction. A set of motivating examples demonstrates how dynamic affordances, constraints and object actuation can create novel interaction possibilities.
    National Science Foundation (U.S.). Graduate Research Fellowship (Grant 1122374)
    Swedish Research Council (Fellowship)
    Blanceflor Foundation (Scholarship)
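    The control loop of a pin-based shape display like inFORM can be sketched as a per-frame update that drives each pin toward a target height field. The travel and speed limits below are illustrative assumptions, not the actual hardware's specifications.

    ```python
    import numpy as np

    # Hypothetical actuator limits; real pin displays have their own travel
    # range and per-frame speed, which these constants only stand in for.
    PIN_TRAVEL_MM = 100.0   # maximum pin extension
    MAX_STEP_MM = 5.0       # maximum movement per control frame

    def render_heights(target, current):
        """Move each pin toward its target height, clamped to the pin's
        physical travel and per-frame speed limit."""
        target = np.clip(target, 0.0, PIN_TRAVEL_MM)
        step = np.clip(target - current, -MAX_STEP_MM, MAX_STEP_MM)
        return current + step
    ```

    Dynamic affordances then reduce to writing different target height fields, e.g. raising a rectangular region of pins to form a button when the application needs one.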

    Kinetic Blocks: Actuated Constructive Assembly for Interaction and Display

    Get PDF
    Pin-based shape displays not only give physical form to digital information; they also have the inherent ability to accurately move and manipulate objects placed on top of them. In this paper we focus on such object manipulation: we present ideas and techniques that use the underlying shape change to give kinetic ability to otherwise inanimate objects. First, we describe the shape display's ability to assemble, disassemble, and reassemble structures from simple passive building blocks through stacking, scaffolding, and catapulting. A technical evaluation demonstrates the reliability of the presented techniques. Second, we introduce special kinematic blocks that are actuated and sensed through the underlying pins. These blocks translate vertical pin movements into other degrees of freedom like rotation or horizontal movement. This interplay of the shape display with objects on its surface allows us to render otherwise inaccessible forms, like overhangs, and enables richer input and output.
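    In the simplest case, a kinematic block that converts vertical pin travel into rotation can be modeled as a rack-and-pinion. The radius below is a hypothetical value for illustration, not a measurement of the actual blocks.

    ```python
    # Hypothetical pinion radius; the real blocks' gearing is not specified here.
    PINION_RADIUS_MM = 10.0

    def rotation_from_pin_travel(dz_mm, radius_mm=PINION_RADIUS_MM):
        """Output-shaft rotation in radians produced by dz_mm of vertical
        pin travel driving a rack-and-pinion of the given radius."""
        return dz_mm / radius_mm
    ```

    Under this model the display's only output primitive, vertical pin motion, is repurposed as a rotary degree of freedom, which is the essence of the kinematic-block idea.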

    Sublimate: State-Changing Virtual and Physical Rendering to Augment Interaction with Shape Displays

    Get PDF
    Recent research in 3D user interfaces pushes towards immersive graphics and actuated shape displays. Our work explores the hybrid of these directions, and we introduce sublimation and deposition, as metaphors for the transitions between physical and virtual states. We discuss how digital models, handles and controls can be interacted with as virtual 3D graphics or dynamic physical shapes, and how user interfaces can rapidly and fluidly switch between those representations. To explore this space, we developed two systems that integrate actuated shape displays and augmented reality (AR) for co-located physical shapes and 3D graphics. Our spatial optical see-through display provides a single user with head-tracked stereoscopic augmentation, whereas our handheld devices enable multi-user interaction through video see-through AR. We describe interaction techniques and applications that explore 3D interaction for these new modalities. We conclude by discussing the results from a user study that show how freehand interaction with physical shape displays and co-located graphics can outperform wand-based interaction with virtual 3D graphics.
    National Science Foundation (U.S.). Graduate Research Fellowship (Grant 1122374)

    Dynamic physical affordances for shape-changing and deformable user interfaces

    No full text
    Thesis: Ph.D., Massachusetts Institute of Technology, School of Architecture and Planning, Program in Media Arts and Sciences, 2015. Cataloged from PDF version of thesis. Includes bibliographical references (pages 207-222).
    The world is filled with tools and devices designed to fit specific needs and goals, and their physical form plays an important role in helping users understand their use. These physical affordances provide products and interfaces with many advantages: they contribute to good ergonomics, allow users to attend to other tasks visually, and take advantage of embodied and distributed cognition by allowing users to offload mental computation spatially. However, devices today include more and more functionality, with increasingly fewer physical affordances, losing many of the advantages in expressivity and dexterity that our hands can provide. My research examines how we can apply shape-changing and deformable interfaces to address the lack of physical affordances in today's interactive products and enable richer physical interaction with general-purpose computing interfaces. In this thesis, I introduce tangible interfaces that use their form to adapt to the functions and ways users want to interact with them. I explore two solutions: 1) creating Dynamic Physical Affordances through shape change, and 2) Improvised Physical Affordances created by users through direct deformation and through appropriation of existing objects. Dynamic Physical Affordances can provide buttons and sliders on demand as an application changes, or even allow users to directly manipulate 3D models or data sets through physical handles which appear out of the data. Improvised Physical Affordances can allow users to squeeze, stretch, and deform input devices to fit their needs, creating the perfect game controller, or shaping a mobile phone around their wrist to form a bracelet.
    Novel technical solutions are needed to enable these new interaction techniques; this thesis describes techniques both for actuation and for robust sensing for shape-changing and deformable interfaces. Finally, systems that utilize Dynamic Physical Affordances and Improvised Physical Affordances are evaluated to understand patterns of use and performance. My belief is that shape-changing UI will become increasingly available in the future, and this work begins to create a vocabulary and design space for more general-purpose interaction for shape-changing UI.
    by Sean Weston Follmer. Ph.D.

    Jamming user interfaces

    No full text
    Malleable and organic user interfaces have the potential to enable radically new forms of interactions and expressiveness through flexible, free-form and computationally controlled shapes and displays. This work focuses specifically on particle jamming as a simple, effective method for flexible, shape-changing user interfaces where programmatic control of material stiffness enables haptic feedback, deformation, tunable affordances and control gain. We introduce a compact, low-power pneumatic jamming system suitable for mobile devices, and a new hydraulic-based technique with fast, silent actuation and optical shape sensing. We enable jamming structures to sense input and function as interaction devices through two contributed methods for high-resolution shape sensing using: 1) index-matched particles and fluids, and 2) capacitive and electric field sensing. We explore the design space of malleable and organic user interfaces enabled by jamming through four motivational prototypes that highlight jamming's potential in HCI, including applications for tabletops, tablets and portable shape-changing mobile devices.
    National Science Foundation (U.S.). Graduate Research Fellowship Program (Grant 1122374)

    TRANSFORM

    No full text
    TRANSFORM fuses technology and design to celebrate the transformation from a piece of static furniture to a dynamic machine driven by streams of data and energy. TRANSFORM aims to inspire viewers with unexpected transformations, as well as the aesthetics of a complex machine in motion. This paper describes the concept, engine, product, and motion design of TRANSFORM, which was first exhibited at LEXUS DESIGN AMAZING 2014 MILAN in April 2014.